Spectral-graph Based Classifications: Linear Regression for Classification and Normalized Radial Basis Function Network
Authors
Abstract
Spectral graph theory has been widely applied in unsupervised and semi-supervised learning, but it remains unclear how it can be exploited in supervised learning. In this paper, we find, for the first time to our knowledge, that it also plays a concrete role in supervised classification. It turns out that two classifiers are inherently related to the theory: linear regression for classification (LRC) and the normalized radial basis function network (nRBFN), corresponding to linear and nonlinear kernels, respectively. Spectral graph theory provides a new insight into a fundamental aspect of classification: the tradeoff between fitting error and overfitting risk. Using the theory, we present ideal working conditions for LRC and nRBFN that ensure not only zero fitting error but also low overfitting risk. For quantitative analysis, two concepts are defined: the fitting error and the spectral risk (an indicator of overfitting). Their bounds for nRBFN and LRC are derived. A special result shows that the spectral risk of nRBFN is lower bounded by the number of classes and upper bounded by the size of the radial basis. When the conditions are not met exactly, the classifiers pursue the minimum fitting error and run the risk of overfitting. It turns out that l2-norm regularization can be applied to control overfitting; its effect is explored in the spectral context. We find that the two terms of the l2-regularized objective correspond one-to-one to the fitting error and the spectral risk, revealing a tradeoff between the two quantities. Concerning practical performance, we devise a basis selection strategy to address the main problem hindering the application of (n)RBFN. With this strategy, nRBFN is easy to implement yet flexible. Experiments on 14 benchmark data sets show that the performance of nRBFN is comparable to that of SVM, while the parameter tuning of nRBFN is much easier, leading to a reduction in model selection time.
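To make the nRBFN construction concrete, the following is a minimal Python sketch, not the paper's exact algorithm: it builds normalized Gaussian basis functions on a subset of training points chosen at random (the paper instead devises a dedicated basis selection strategy), fits the output weights by l2-regularized least squares, and classifies by the largest output. The function names, the bandwidth sigma, the regularization weight lam, and the random basis selection are illustrative assumptions.

import numpy as np

def normalized_rbf(X, centers, sigma):
    # Gaussian responses to each basis center, then row-normalize so each row sums to 1
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    return K / K.sum(axis=1, keepdims=True)

def nrbfn_fit(X, y, n_basis=50, sigma=1.0, lam=1e-2, seed=0):
    # y is expected to hold integer class labels 0..C-1.
    # Random basis selection stands in for the paper's basis selection strategy.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(n_basis, len(X)), replace=False)
    centers = X[idx]
    Phi = normalized_rbf(X, centers, sigma)          # n x m normalized design matrix
    Y = np.eye(int(y.max()) + 1)[y]                  # one-hot class indicator matrix
    # l2-regularized least squares: W = (Phi^T Phi + lam*I)^(-1) Phi^T Y
    W = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ Y)
    return centers, W

def nrbfn_predict(X, centers, W, sigma=1.0):
    # Assign each sample to the class with the largest network output
    return np.argmax(normalized_rbf(X, centers, sigma) @ W, axis=1)

Setting lam = 0 recovers the pure minimum-fitting-error solution discussed above, while increasing lam shifts weight toward the penalty term that the paper identifies with the spectral risk. Replacing the normalized RBF features with the raw inputs (plus a bias column) would give an analogous sketch of LRC.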
Similar resources
Neural Network Based Recognition System Integrating Feature Extraction and Classification for English Handwritten
Handwriting recognition has been one of the active and challenging research areas in the field of image processing and pattern recognition. It has numerous applications, including reading aids for the blind, bank cheque processing, and the conversion of handwritten documents into structured text form. A Neural Network (NN), with its inherent learning ability, offers promising solutions for handwritten characte...
Buckling of Doubly Clamped Nano-Actuators in General form Through Spectral Meshless Radial Point Interpolation (SMRPI)
The present paper is devoted to the development of a spectral meshless radial point interpolation (SMRPI) technique for obtaining a reliable approximate solution for the buckling of nano-actuators subject to different nonlinear forces. To this end, a general form of the governing equation for nano-actuators, containing integro-differential terms and nonlinear forces, is considered. ...
Fast Voltage and Power Flow Contingency Ranking Using Enhanced Radial Basis Function Neural Network
Deregulation of power systems in recent years has made static security assessment a major concern, for which fast and accurate evaluation methodologies are needed. Contingencies related to voltage violations and power line overloading have been responsible for power system collapse. This paper presents an enhanced radial basis function neural network (RBFNN) approach for on-line ranking of ...
Developing a Radial Basis Function Neural Networks to Predict the Working Days for Tillage Operation in Crop Production
The aim of this study was to determine the probability of working days (PWD) for tillage operation using weather data with Multiple Linear Regression (MLR) and Radial Basis Function (RBF) artificial networks. In both models, seven variables were considered as input parameters, namely minimum, average and maximum temperature, relative humidity, rainfall, wind speed, and evaporation on a daily ba...
Improved fast Gauss transform User manual
In most kernel-based machine learning algorithms and in non-parametric statistics, the key computational task is to compute a linear combination of local kernel functions centered on the training data, i.e., f(x) = Σ_{i=1}^{N} q_i k(x, x_i), which is the discrete Gauss transform for the Gaussian kernel. f is the regression/classification function in the case of regularized least squares, Gaussian process regre...
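As a point of reference, the discrete Gauss transform in the snippet above can be evaluated directly as sketched below; the function name, variable names, and the bandwidth h are illustrative, and the improved fast Gauss transform exists precisely to approximate this quadratic-time sum more efficiently.

import numpy as np

def discrete_gauss_transform(targets, sources, q, h):
    # f(x_j) = sum_i q_i * exp(-||x_j - x_i||^2 / h^2), evaluated naively in O(M*N)
    d2 = ((targets[:, None, :] - sources[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / h ** 2) @ q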
Journal title: CoRR
Volume: abs/1705.06922
Pages: -
Publication year: 2017